Conversation

@serge-p7v serge-p7v commented Jan 27, 2026

Add an onLLMPromptTransforming handler that can transform the Prompt before it is passed to the LLM.

Motivation and Context

For some use cases, such as RAG or chat memory, it can be useful to intercept every request to an LLM and augment it (with relevant documents from a knowledge base for RAG, or with previous messages for chat memory).
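As a rough illustration of the idea, here is a standalone Kotlin sketch with hypothetical names (`Pipeline`, `transformPrompt`, the simplified `Prompt` type); the real handler in the framework receives more parameters (context, model, tools). It shows how several registered transformers, e.g. one for RAG and one for chat memory, would be folded over the prompt in registration order:

```kotlin
// Standalone sketch; hypothetical names, not the actual framework API.
data class Prompt(val messages: List<String>)

typealias PromptTransformer = (Prompt) -> Prompt

class Pipeline {
    private val transformers = mutableListOf<PromptTransformer>()

    // Register a transformer; multiple registrations are applied in order.
    fun onLLMPromptTransforming(transformer: PromptTransformer) {
        transformers += transformer
    }

    // Fold the prompt through every registered transformer.
    fun transformPrompt(prompt: Prompt): Prompt =
        transformers.fold(prompt) { p, t -> t(p) }
}

fun main() {
    val pipeline = Pipeline()
    // RAG: prepend documents retrieved from a knowledge base.
    pipeline.onLLMPromptTransforming { p ->
        p.copy(messages = listOf("[retrieved: doc-1]") + p.messages)
    }
    // Chat memory: prepend previous conversation turns.
    pipeline.onLLMPromptTransforming { p ->
        p.copy(messages = listOf("[memory: earlier turn]") + p.messages)
    }
    val sent = pipeline.transformPrompt(Prompt(listOf("user question")))
    println(sent.messages)
    // -> [[memory: earlier turn], [retrieved: doc-1], user question]
}
```

Because the transformers compose by folding, each one sees the output of the previous one, which is what makes the chaining behavior discussed below predictable.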

Breaking Changes

None.


Type of the changes

  • New feature (non-breaking change which adds functionality)
  • Bug fix (non-breaking change which fixes an issue)
  • Breaking change (fix or feature that would cause existing functionality to change)
  • Documentation update
  • Tests improvement
  • Refactoring
  • CI/CD changes
  • Dependencies update

Checklist

  • The pull request has a description of the proposed change
  • I read the Contributing Guidelines before opening the pull request
  • The pull request uses develop as the base branch
  • Tests for the changes have been added
  • All new and existing tests passed
Additional steps for pull requests adding a new feature
  • An issue describing the proposed change exists
  • The pull request includes a link to the issue
  • The change was discussed and approved in the issue
  • Docs have been added / updated

logger.debug { "Executing LLM call (event id: $eventId, prompt: $prompt, tools: [${tools.joinToString { it.name }}])" }
context.pipeline.onLLMCallStarting(eventId, context.executionInfo, context.runId, prompt, model, tools, context)
logger.debug { "Transforming prompt (event id: $eventId, prompt: $prompt, tools: [${tools.joinToString { it.name }}])" }
val transformedPrompt = context.pipeline.onLLMPromptTransforming(
@serge-p7v (Contributor, Author)
TODO: other methods here like "executeStreaming" or "moderate" should also call "onLLMPromptTransforming".

Contributor

@serge-p7v, could you please clarify a general question here? With this update we have two events with the same set of parameters fired one after another. Unless I am missing something, this looks a bit redundant: you can already transform the prompt inside the onLLMCallStarting handler, and prompt is a variable in AIAgentLLMWriteSession that can be updated. Would that work as well? Why do we need a separate interceptor here?

@serge-p7v (Contributor, Author)

Currently, it is not possible to change the prompt inside onLLMCallStarting (the prompt is immutable); to support the intercepting approach we would need to modify onLLMCallStarting. There would be two issues with that:

  • Prompt transformations made through AIAgentLLMWriteSession change the conversation state. In this PR the transformations are applied only to the current call to the LLM (they are "transient").
  • onLLMCallStarting would become a potentially mutating handler, and it would be harder to reason about what it can do in a chain. In this PR onLLMPromptTransforming does only prompt transformation, so when chaining multiple onLLMCallStarting handlers with multiple onLLMPromptTransforming handlers we can be sure where the transformation happens.

Regarding the same set of parameters: good idea, thank you; most likely the transformer needs only the prompt and the context!
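The "transient" point above can be illustrated with a standalone Kotlin sketch (hypothetical names such as `Session` and `requestLLM`, not the framework API): the transformed prompt is used only for the outgoing call, while the session's prompt, i.e. the conversation state, is left untouched:

```kotlin
// Standalone sketch; hypothetical names, not the actual framework API.
data class Prompt(val messages: List<String>)

// Stand-in for AIAgentLLMWriteSession: `prompt` is the persisted
// conversation state.
class Session(var prompt: Prompt) {
    // Applies the transformation only to the outgoing request;
    // `prompt` itself is not updated.
    fun requestLLM(transform: (Prompt) -> Prompt): Prompt {
        val transient = transform(prompt)
        return transient // stands in for the actual LLM call
    }
}

fun main() {
    val session = Session(Prompt(listOf("user question")))
    val sent = session.requestLLM { p ->
        p.copy(messages = listOf("[retrieved context]") + p.messages)
    }
    println(sent.messages)           // [[retrieved context], user question]
    println(session.prompt.messages) // [user question]: state unchanged
}
```

By contrast, transforming through the write session would overwrite `session.prompt`, so the augmentation would accumulate in the conversation history on every call.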

@serge-p7v (Contributor, Author)

@sdubov has suggested a great idea with extracting memory-specific handlers into a different feature. Converting the PR to draft.

…Handler feature that can modify the prompt before sending it to an LLM
@serge-p7v serge-p7v force-pushed the onLLMPromptTransforming branch from 50e06a5 to ae57fe5 Compare January 27, 2026 13:50
@serge-p7v serge-p7v marked this pull request as draft January 27, 2026 17:17